Analytical derivatives of neural networks
Authors
Abstract
We propose a simple recursive algorithm that allows the computation of first- and second-order derivatives with respect to the inputs of an arbitrary deep feed-forward neural network (DFNN). The algorithm naturally incorporates derivatives with respect to the network parameters as well. To test the algorithm, we apply it to study the quantum mechanical variational problem for a few cases of potentials, modeling the ground-state wave function in terms of a DFNN.
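The paper's own recursion is not reproduced in the abstract, but the core idea, propagating input derivatives layer by layer via the chain rule, can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the tanh activation, layer sizes, and function name are assumptions, and only the first-order Jacobian is shown.

```python
import numpy as np

def forward_with_jacobian(x, weights, biases):
    """Forward pass through a tanh feed-forward network, recursively
    propagating the Jacobian of the activations w.r.t. the input x:
        J_l = diag(sigma'(z_l)) @ W_l @ J_{l-1},  with J_0 = I.
    """
    a = np.asarray(x, dtype=float)
    J = np.eye(a.size)  # d a / d x starts as the identity
    n_layers = len(weights)
    for l, (W, b) in enumerate(zip(weights, biases)):
        z = W @ a + b
        if l < n_layers - 1:       # hidden layers: tanh activation
            a = np.tanh(z)
            J = (1.0 - a ** 2)[:, None] * (W @ J)
        else:                      # linear output layer
            a = z
            J = W @ J
    return a, J
```

A quick sanity check is to compare the recursively computed Jacobian against central finite differences on a randomly initialized network; the two should agree to roughly the square of the step size.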
Similar articles
Rodbar dam slope stability analysis using neural networks
In this study, an artificial neural network is presented for predicting the safety factor and the critical safety factor of inhomogeneous earth dams, taking into account the effect of earthquake inertia forces. The model inputs include the dam height, upstream slope angle, seismic coefficient, water height, and the strength parameters of the core and shell; the output is the safety factor. The most important quantity in slope stability analysis is the safety factor. In this study ...
Cryptography based on neural networks—analytical results
The mutual learning process between two parity feed-forward networks with discrete and continuous weights is studied analytically, and we find that the number of steps required to achieve full synchronization between the two networks in the case of discrete weights is finite. The synchronization process is shown to be non-self-averaging and the analytical solution is based on random auxiliary v...
Analytical investigation of self-organized criticality in neural networks
Dynamical criticality has been shown to enhance information processing in dynamical systems, and there is evidence for self-organized criticality in neural networks. A plausible mechanism for such self-organization is activity-dependent synaptic plasticity. Here, we model neurons as discrete-state nodes on an adaptive network following stochastic dynamics. At a threshold connectivity, this syst...
Critical Points of Neural Networks: Analytical Forms and Landscape Properties
Due to the success of deep learning in solving a variety of challenging machine learning tasks, there is rising interest in understanding loss functions for training neural networks from a theoretical aspect. In particular, the properties of critical points and the landscape around them are important for determining the convergence performance of optimization algorithms. In this paper, we pro...
Analytical Guarantees on Numerical Precision of Deep Neural Networks
The acclaimed successes of neural networks often overshadow their tremendous complexity. We focus on numerical precision, a key parameter defining the complexity of neural networks. First, we present theoretical bounds on the accuracy in the presence of limited precision. Interestingly, these bounds can be computed via the back-propagation algorithm. Hence, by combining our theoretical analysis and ...
Journal
Journal title: Computer Physics Communications
Year: 2022
ISSN: 1879-2944, 0010-4655
DOI: https://doi.org/10.1016/j.cpc.2021.108169